Linear components of quadratic classifiers

Authors
Abstract



Similar resources

SVM Soft Margin Classifiers: Linear Programming versus Quadratic Programming

Support vector machine soft margin classifiers are important learning algorithms for classification problems. They can be stated as convex optimization problems and are suitable for large-scale data settings. The linear programming SVM classifier is especially efficient for very large sample sizes, but little is known about its convergence compared with the well-understood quadratic programming SVM classifier...
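A minimal sketch of the contrast, assuming the cvxpy modelling library and toy data (not the paper's code): with a squared 2-norm penalty on w the soft margin problem is a quadratic program, while swapping in a 1-norm penalty turns the whole problem into a linear program.

```python
# Sketch: soft margin SVM as a QP (2-norm penalty) vs. an LP (1-norm penalty).
# cvxpy and the toy data below are assumptions for illustration.
import cvxpy as cp
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))                          # toy features
y = np.sign(X[:, 0] - 0.5 * X[:, 1] + 0.1 * rng.normal(size=200))

def soft_margin_svm(X, y, C=1.0, norm="l2"):
    n, d = X.shape
    w, b = cp.Variable(d), cp.Variable()
    # Hinge loss: positive part of the margin violations.
    hinge = cp.sum(cp.pos(1 - cp.multiply(y, X @ w + b)))
    if norm == "l2":
        reg = 0.5 * cp.sum_squares(w)                  # QP formulation
    else:
        reg = cp.norm1(w)                              # 1-norm -> pure LP
    cp.Problem(cp.Minimize(reg + C * hinge)).solve()
    return w.value, b.value

w_qp, _ = soft_margin_svm(X, y, norm="l2")
w_lp, _ = soft_margin_svm(X, y, norm="l1")
print("QP weights:", np.round(w_qp, 3))
print("LP weights:", np.round(w_lp, 3))                # typically sparser
```

The LP variant only needs linear constraints and a linear objective, which is one reason it scales to very large samples, at the cost of the margin geometry the 2-norm provides.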


Linear classifiers

Above, w ∈ R^d is a vector of real-valued weights, which we call a weight vector, and θ ∈ R is a threshold value. The weight vector (assuming it is non-zero) is perpendicular to a hyperplane of dimension d − 1 that passes through the point θw/‖w‖²; this hyperplane separates the points x ∈ R^d that are classified as +1 from those that are classified as −1 by f_{w,θ}. Homogeneous half-space functions are ha...
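A small numerical check of this geometry, assuming the rule implied above is f_{w,θ}(x) = sign(w·x − θ); the specific values of w and θ are illustrative.

```python
# Verify that θw/‖w‖² lies on the hyperplane {x : w·x = θ} and that the
# half-space classifier flips sign on either side of it.
import numpy as np

w = np.array([2.0, -1.0, 0.5])    # weight vector (assumed)
theta = 1.5                       # threshold (assumed)

def f(x, w, theta):
    """Half-space classifier: +1 on one side of the hyperplane, -1 on the other."""
    return np.sign(w @ x - theta)

x0 = theta * w / np.dot(w, w)     # the point θw/‖w‖²
print(w @ x0)                     # -> 1.5, i.e. x0 is on the hyperplane

print(f(x0 + 0.1 * w, w, theta))  # +1: shifted along w
print(f(x0 - 0.1 * w, w, theta))  # -1: shifted against w
```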


Cascading Asymmetric Linear Classifiers

Motivation: Combinations of classifiers have been found useful empirically, yet there is no formal proof of their generalization ability. Our goal is to develop an algorithm to train a sequence of linear classifiers yielding a nonlinear decision surface. We believe that choosing asymmetric regularization parameters for each class can yield a sequence of classifiers that approximates arbitrarily...
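A rough sketch of the cascade idea, assuming scikit-learn's LinearSVC; the asymmetric class weights stand in for the asymmetric regularization parameters described above, and the stage-by-stage filtering rule is an illustrative assumption, not the authors' algorithm.

```python
# Sketch: a cascade of linear classifiers whose conjunction is a
# nonlinear (piecewise-linear) decision surface.
import numpy as np
from sklearn.svm import LinearSVC

def train_cascade(X, y, n_stages=3, pos_weight=10.0):
    """Each stage heavily penalises false negatives (asymmetric weights),
    then passes only the points it accepts on to the next stage."""
    stages = []
    mask = np.ones(len(y), dtype=bool)
    for _ in range(n_stages):
        if len(np.unique(y[mask])) < 2:     # nothing left to separate
            break
        clf = LinearSVC(class_weight={1: pos_weight, -1: 1.0})
        clf.fit(X[mask], y[mask])
        stages.append(clf)
        mask &= (clf.predict(X) == 1)       # later stages see only accepted points
    return stages

def cascade_predict(stages, X):
    """+1 only if every stage accepts: the accepted region is an
    intersection of half-spaces, hence a nonlinear overall boundary."""
    accepted = np.ones(len(X), dtype=bool)
    for clf in stages:
        accepted &= (clf.predict(X) == 1)
    return np.where(accepted, 1, -1)
```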


Bagging for linear classifiers

Classifiers built on small training sets are usually biased or unstable. Different techniques exist to construct more stable classifiers, but it is not clear which ones are good, and whether they really stabilize the classifier or merely improve its performance. In this paper, bagging (bootstrapping and aggregating [1]) is studied for a number of linear classifiers. A measure for the instability of classifiers...
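A minimal sketch of bagging applied to a linear base classifier, assuming NumPy and scikit-learn; the logistic-regression base learner and majority vote are illustrative choices, not the paper's experimental setup.

```python
# Sketch: bootstrap resample the training set, fit a linear classifier on
# each resample, and aggregate the predictions by majority vote.
import numpy as np
from sklearn.linear_model import LogisticRegression

def bagged_linear(X, y, n_rounds=25, seed=0):
    """Fit n_rounds linear classifiers, each on a bootstrap resample."""
    rng = np.random.default_rng(seed)
    models = []
    for _ in range(n_rounds):
        idx = rng.integers(0, len(y), size=len(y))   # sample with replacement
        models.append(LogisticRegression().fit(X[idx], y[idx]))
    return models

def majority_vote(models, X):
    """Aggregate: sign of the summed ±1 votes (odd n_rounds avoids ties)."""
    votes = np.stack([m.predict(X) for m in models])  # (n_rounds, n_samples)
    return np.sign(votes.sum(axis=0))
```

The spread of the individual models' predictions across bootstrap rounds also gives a natural handle on the instability the abstract refers to: a stable classifier changes little from resample to resample.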


Learning Mixtures of Linear Classifiers

We consider a discriminative learning (regression) problem in which the regression function is a convex combination of k linear classifiers. Existing approaches are based on the EM algorithm or similar techniques, without provable guarantees. We develop a simple method based on spectral techniques and a ‘mirroring’ trick that discovers the subspace spanned by the classifiers’ parameter vectors...
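To make the setting concrete, here is a toy generator for the model described above, where the regression function is a convex combination of k sign classifiers; the parameter values are assumptions, and the spectral recovery method with the ‘mirroring’ trick is not reproduced here.

```python
# Sketch of the model: f(x) = sum_i p_i * sign(a_i · x), with p a convex
# combination. The learning goal is to recover span{a_1, ..., a_k}.
import numpy as np

rng = np.random.default_rng(1)
d, k, n = 10, 3, 1000
A = rng.normal(size=(k, d))          # classifier parameter vectors a_i (assumed)
p = np.array([0.5, 0.3, 0.2])        # convex combination weights (sum to 1)

X = rng.normal(size=(n, d))
f = np.sign(X @ A.T) @ p             # regression values f(x_1), ..., f(x_n)
print(f[:5])
```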



Journal

Journal title: Advances in Data Analysis and Classification

Year: 2018

ISSN: 1862-5347,1862-5355

DOI: 10.1007/s11634-018-0321-6